Neighborhood-Adaptive Structure Augmented Metric Learning


Abstract

Most metric learning techniques typically focus on sample embedding learning, while implicitly assuming a homogeneous local neighborhood around each sample, based on the metrics used in training (e.g., a hypersphere for Euclidean distance or a unit hyperspherical crown for cosine distance). As real-world data often lies on a low-dimensional manifold curved in a high-dimensional space, it is unlikely that every region of the input space shares the same structure. Besides, considering the non-linearity of neural networks, the local structure in the output embedding space may not be as homogeneous as assumed. Therefore, representing a sample simply with its embedding while ignoring its individual neighborhood structure would have limitations in Embedding-Based Retrieval (EBR). By exploiting the heterogeneity of local structures, we propose a Neighborhood-Adaptive Structure Augmented metric learning framework (NASA), where the neighborhood structure is realized as a structure embedding and learned along with the sample embedding in a self-supervised manner. In this way, without any modifications, most indexing techniques can be used to support large-scale EBR with NASA embeddings. Experiments on six standard benchmarks with two kinds of embeddings, i.e., binary embeddings and real-valued embeddings, show that our method significantly improves and outperforms state-of-the-art methods.
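As a rough, purely illustrative sketch of the idea of augmenting a sample embedding with a summary of its local neighborhood (all names and the specific construction below are our own; this is not NASA's actual learned structure embedding, which is trained self-supervised alongside the sample embedding):

```python
import numpy as np

def knn_structure_embedding(X, k=3):
    """Summarize each sample's neighborhood by the mean unit direction
    toward its k nearest neighbors (a crude, hand-crafted stand-in for
    a learned structure embedding)."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)                 # exclude self-distances
    idx = np.argsort(D, axis=1)[:, :k]          # indices of k nearest neighbors
    offsets = X[idx] - X[:, None, :]            # (n, k, d) offsets to neighbors
    s = offsets.mean(axis=1)                    # average local direction
    norm = np.linalg.norm(s, axis=1, keepdims=True)
    return s / np.maximum(norm, 1e-12)          # unit-normalized structure vectors

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 4))                    # toy sample embeddings
S = knn_structure_embedding(X, k=3)
aug = np.concatenate([X, S], axis=1)            # structure-augmented embedding
print(aug.shape)                                # (10, 8)
```

Because the augmented vectors are still fixed-length, they can be fed to any off-the-shelf index (e.g., for approximate nearest-neighbor search) unchanged, which mirrors the abstract's claim that NASA embeddings need no indexing-side modifications.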


Similar resources

Discriminative Metric Learning by Neighborhood Gerrymandering

We formulate the problem of metric learning for k nearest neighbor classification as a large margin structured prediction problem, with a latent variable representing the choice of neighbors and the task loss directly corresponding to classification error. We describe an efficient algorithm for exact loss-augmented inference, and a fast gradient descent algorithm for learning in this...


Shrinkage Expansion Adaptive Metric Learning

Conventional pairwise constrained metric learning methods usually restrict the distance between samples of a similar pair to be lower than a fixed upper bound, and the distance between samples of a dissimilar pair to be higher than a fixed lower bound. Such fixed-bound-based constraints, however, may not work well when the intra- and inter-class variations are complex. In this paper, we propose a shrin...
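For reference, the fixed-bound pairwise constraint this abstract describes (and which the paper's shrinkage-expansion scheme relaxes) can be written as a simple hinge loss. A minimal sketch; the function name and bound values are illustrative, not from the paper:

```python
import numpy as np

def pairwise_bound_loss(d, similar, upper=1.0, lower=2.0):
    """Fixed-bound pairwise constraint: similar pairs should have
    distance below `upper`, dissimilar pairs above `lower`.
    An adaptive scheme would instead vary the bounds per pair."""
    d = np.asarray(d, dtype=float)
    similar = np.asarray(similar, dtype=bool)
    loss = np.where(similar,
                    np.maximum(0.0, d - upper),   # pull similar pairs in
                    np.maximum(0.0, lower - d))   # push dissimilar pairs out
    return loss.mean()

# Pairs already satisfying their bounds incur zero loss.
print(pairwise_bound_loss([0.5, 2.5], [True, False]))   # 0.0
# Violations are penalized linearly.
print(pairwise_bound_loss([1.5, 1.0], [True, False]))   # 0.75
```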


Multi-granularity distance metric learning via neighborhood granule margin maximization

Learning a distance metric from training samples is often a crucial step in machine learning and pattern recognition. Locality, compactness and consistency are considered as the key principles in distance metric learning. However, the existing metric learning methods just consider one or two of them. In this paper, we develop a multi-granularity distance learning technique. First, a new index, ...


Metric Learning with Adaptive Density Discrimination

Distance metric learning (DML) approaches learn a transformation to a representation space where distance is in correspondence with a predefined notion of similarity. While such models offer a number of compelling benefits, it has been difficult for these to compete with modern classification algorithms in performance and even in feature extraction. In this work, we propose a novel approach exp...



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2022

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v36i2.20025